Path: blob/master/Part 10 - Model Selection And Boosting/XGBoost/[R] XGBoost.ipynb
Kernel: R
XGBoost
Data preprocessing
In [1]:
In [2]:
Out[2]:
In [3]:
In [4]:
Out[4]:
In [5]:
In [6]:
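The preprocessing cells above were not preserved in this export. A minimal sketch of what they typically contain in this course is below; the dataset name (`Churn_Modelling.csv`), column positions, and target column (`Exited`) are assumptions, not taken from this notebook:

```r
# Sketch of the data preprocessing cells (dataset name, column
# indices, and target column are assumptions).
dataset = read.csv('Churn_Modelling.csv')
dataset = dataset[4:14]  # drop identifier columns

# Encode categorical variables as numeric factors, since
# xgboost requires a fully numeric matrix.
dataset$Geography = as.numeric(factor(dataset$Geography,
                                      levels = c('France', 'Spain', 'Germany'),
                                      labels = c(1, 2, 3)))
dataset$Gender = as.numeric(factor(dataset$Gender,
                                   levels = c('Female', 'Male'),
                                   labels = c(1, 2)))

# Split into training and test sets.
library(caTools)
set.seed(123)
split = sample.split(dataset$Exited, SplitRatio = 0.8)
training_set = subset(dataset, split == TRUE)
test_set = subset(dataset, split == FALSE)
```

No feature scaling is needed here: tree-based models such as XGBoost are insensitive to monotonic feature transformations.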
Fitting XGBoost to the Training set
In [7]:
In [18]:
Out[18]:
[1] train-rmse:0.417976
[2] train-rmse:0.369643
[3] train-rmse:0.341009
[4] train-rmse:0.325538
[5] train-rmse:0.316370
[6] train-rmse:0.309533
[7] train-rmse:0.306012
[8] train-rmse:0.302703
[9] train-rmse:0.300868
[10] train-rmse:0.298456
The training error appears to converge around 0.30. It is worth noting that train-rmse (the root mean squared error on the training set) will keep decreasing if we keep increasing nrounds. I tried it myself, but the output is too long to display here; it is better to try it by running the code below (which also performs cross-validation to assess model performance).
Also, if you don't want to display the train-rmse log, pass verbose = 0 as a parameter.
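The fitting cell itself was lost in this export. A sketch of a typical xgboost() call is below, assuming the `training_set` from the preprocessing step with the target `Exited` in column 11 (both assumptions); it shows where nrounds and verbose fit in:

```r
# Fitting XGBoost to the training set (a sketch; training_set and
# the target column position are assumptions from the preprocessing step).
library(xgboost)
classifier = xgboost(data = as.matrix(training_set[-11]),
                     label = training_set$Exited,
                     nrounds = 10)

# Increasing nrounds keeps driving train-rmse down; pass verbose = 0
# to suppress the per-round training log:
# classifier = xgboost(data = as.matrix(training_set[-11]),
#                      label = training_set$Exited,
#                      nrounds = 100, verbose = 0)
```

Note that a steadily decreasing train-rmse alone does not mean the model is improving on unseen data, which is why the cross-validation below matters.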
Applying k-Fold Cross Validation
In [9]:
Out[9]:
Loading required package: lattice
Loading required package: ggplot2
In [10]:
Out[10]:
[1] train-rmse:0.417976
[2] train-rmse:0.369643
[3] train-rmse:0.341009
[4] train-rmse:0.325538
[5] train-rmse:0.316370
[6] train-rmse:0.309533
[7] train-rmse:0.306012
[8] train-rmse:0.302703
[9] train-rmse:0.300868
[10] train-rmse:0.298456
(the same 10-round training log is printed again for each of the remaining nine folds)
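The cross-validation cells were also lost in the export. The caret loading messages above (lattice, ggplot2) suggest a fold-based loop like the sketch below; the helper names, the target column position, and the 0.5 prediction threshold are assumptions:

```r
# Sketch of 10-fold cross-validation with caret::createFolds
# (training_set, column indices, and threshold are assumptions).
library(caret)
library(xgboost)
set.seed(123)
folds = createFolds(training_set$Exited, k = 10)
cv = lapply(folds, function(x) {
  training_fold = training_set[-x, ]
  test_fold = training_set[x, ]
  # Retrain on this fold's training split; verbose = 0 keeps
  # the per-round log from printing ten times.
  classifier = xgboost(data = as.matrix(training_fold[-11]),
                       label = training_fold$Exited,
                       nrounds = 10, verbose = 0)
  # Threshold the predicted probabilities and score accuracy.
  y_pred = predict(classifier, newdata = as.matrix(test_fold[-11]))
  y_pred = as.numeric(y_pred >= 0.5)
  cm = table(test_fold[, 11], y_pred)
  accuracy = (cm[1, 1] + cm[2, 2]) / sum(cm)
  return(accuracy)
})
mean(as.numeric(cv))  # average accuracy across the 10 folds
```

Averaging the per-fold accuracies gives a more reliable performance estimate than the single train-rmse curve above, since each fold is evaluated on data the model never saw during training.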
In [11]:
Out[11]:
In [12]:
Out[12]: